Universality of Fully-Connected Recurrent Neural Networks

Author

  • Kenji Doya
Abstract

It is shown from the universality of multi-layer neural networks that any discrete-time or continuous-time dynamical system can be approximated by discrete-time or continuous-time recurrent neural networks, respectively.
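
The construction implied by this result can be sketched as follows (a rough outline under assumed notation, not the paper's exact proof). By the universal approximation property, a one-hidden-layer network \hat{f}(x) = C\,\sigma(Ax + b), with hidden weights A, biases b, output weights C, and a sigmoidal activation \sigma, can approximate the transition map or vector field f of the target system uniformly on a compact region. Substituting \hat{f} for f then turns the system itself into a recurrent network:

Discrete time:
\[ x_{t+1} = f(x_t) \approx C\,\sigma(A x_t + b) \]

Continuous time:
\[ \dot{x}(t) = f(x(t)) \approx C\,\sigma(A x(t) + b) \]

Because the right-hand sides are exactly the update rule and the flow of a discrete-time and a continuous-time recurrent network, a uniform approximation of f keeps the network's trajectories close to those of the original system over bounded time intervals (by the usual continuous-dependence argument).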

Similar articles

Fault Detection and Location in DC Microgrids by Recurrent Neural Networks and Decision Tree Classifier

Microgrids have played an important role in distribution networks in recent years. DC microgrids are very popular among researchers because of their benefits. Protection is one of the significant challenges to the progress of microgrids. As a result, this paper proposes a fault detection and location scheme for DC microgrids. Due to advances in Artificial Intelligence (AI) and s...

Connection Reduction of the Recurrent Networks

There are many variations on the topology of recurrent networks. Models with fully-connected recurrent weights may not be superior to models with sparsely connected recurrent weights in terms of capacity, training time, and generalization ability. In this paper, we show that fully-connected recurrent networks can be reduced to functionally equivalent partially-connected recurrent n...

Estimation of Network Reliability for a Fully Connected Network with Unreliable Nodes and Unreliable Edges using Neuro Optimization

This paper attempts to estimate the reliability of a fully connected network with unreliable nodes and unreliable connections (edges) between them. Electronic messaging has proliferated during the last few years, and the acute problem of node failure and connection failure is frequently encountered in communication over various types of networks. We know that a ne...

Phoneme Probability Estimation with Dynamic Sparsely Connected Artificial Neural Networks

This paper presents new methods for training large neural networks for phoneme probability estimation. An architecture combining time-delay windows and recurrent connections is used to capture the important dynamic information of the speech signal. Because the number of connections in a fully connected recurrent network grows super-linearly with the number of hidden units, schemes for sparse conn...
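
To make the quoted growth rate concrete (a back-of-the-envelope count of my own, ignoring input and bias connections): a fully connected recurrent layer with n hidden units has one recurrent weight per ordered pair of units, so the recurrent connection count grows quadratically in n.

\[ \mbox{recurrent weights} = n^2, \qquad n = 1000 \;\Rightarrow\; 10^6 \ \mbox{recurrent weights} \]

Sparse connection schemes, as pursued in the paper above, target this quadratic term.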

Robust stability of stochastic fuzzy impulsive recurrent neural networks with time-varying delays

In this paper, global robust stability of stochastic impulsive recurrent neural networks with time-varying delays, represented by Takagi-Sugeno (T-S) fuzzy models, is considered. A novel Linear Matrix Inequality (LMI)-based stability criterion is obtained by using Lyapunov functional theory to guarantee the asymptotic stability of uncertain fuzzy stochastic impulsive recurrent neural...

Publication date: 1993